Survey Descent: A Multipoint Generalization of Gradient Descent for Nonsmooth Optimization
Authors
Abstract
For strongly convex objectives that are smooth, the classical theory of gradient descent ensures linear convergence relative to the number of gradient evaluations. An analogous nonsmooth theory is challenging: even when the objective is smooth at every iterate, the corresponding local models are unstable, and the number of cutting planes invoked by traditional remedies is difficult to bound, leading to convergence guarantees that are sublinear relative to the cumulative number of gradient evaluations. We instead propose a multipoint generalization of the gradient descent iteration for nonsmooth optimization. While our method was designed with general objectives in mind, we are motivated by a "max-of-smooth" model that captures the subdifferential dimension at optimality. We prove linear convergence when the objective is itself max-of-smooth, and experiments suggest a more general phenomenon.
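The "max-of-smooth" setting can be made concrete with a toy example. The sketch below (all data, step sizes, and names are illustrative assumptions, and this is plain fixed-step gradient descent on the active smooth piece, not the paper's survey-descent iteration) shows why single-point gradient descent struggles near the nonsmooth kink:

```python
import numpy as np

# Max-of-smooth toy objective: f(x) = max_i 0.5 * ||x - c_i||^2.
# Each piece is smooth, but f has a kink where pieces cross.
# All data below are assumed purely for illustration.
centers = np.array([[2.0, 0.0], [-2.0, 0.0], [0.0, 2.0]])

def f(x):
    return max(0.5 * np.dot(x - c, x - c) for c in centers)

def active_gradient(x):
    # Gradient of the piece attaining the max (a subgradient of f).
    i = int(np.argmax([0.5 * np.dot(x - c, x - c) for c in centers]))
    return x - centers[i]

x = np.array([5.0, 5.0])
for _ in range(200):
    x = x - 0.5 * active_gradient(x)  # fixed-step descent on the active piece
# With a fixed step the iterates end up oscillating between pieces,
# and f stalls above its minimum value (which is 2.0 for this data).
```

This stagnation of the single-point iteration near the kink is the kind of behavior that motivates tracking several points at once.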
Similar Resources
Stochastic Coordinate Descent for Nonsmooth Convex Optimization
Stochastic coordinate descent, due to its practicality and efficiency, is increasingly popular in the machine learning and signal processing communities, as it has proven successful in several large-scale optimization problems such as l1-regularized regression and the Support Vector Machine, to name a few. In this paper, we consider a composite problem where the nonsmoothness has a general structure that...
A Coordinate Gradient Descent Method for Nonsmooth Nonseparable Minimization
This paper presents a coordinate gradient descent approach for minimizing the sum of a smooth function and a nonseparable convex function. We find a search direction by solving a subproblem obtained by a second-order approximation of the smooth function and adding a separable convex function. Under a local Lipschitzian error bound assumption, we show that the algorithm possesses global and loca...
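The coordinate step described above (a quadratic model of the smooth part plus the separable convex part) can be sketched for one common special case. In the snippet below, the smooth function, the l1 penalty, the weight `lam`, and all data are assumptions for illustration, not details from the paper; for this choice the per-coordinate subproblem reduces to soft-thresholding:

```python
import numpy as np

# Illustrative assumptions: smooth part f(x) = 0.5 * ||Ax - b||^2,
# separable convex part P(x) = lam * ||x||_1.  Each coordinate step
# minimizes a 1-D quadratic model of f plus the |.| term, which has
# the closed-form soft-threshold solution used below.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1

def soft(z, t):
    # Proximal map of t * |.|
    return np.sign(z) * max(abs(z) - t, 0.0)

x = np.zeros(5)
h = (A ** 2).sum(axis=0)           # per-coordinate curvature of f
for sweep in range(100):
    for j in range(5):
        g = A[:, j] @ (A @ x - b)  # partial derivative of f at x
        x[j] = soft(x[j] - g / h[j], lam / h[j])
```

Each inner update is exact for its one-dimensional subproblem, so the composite objective decreases monotonically across sweeps.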
A coordinate gradient descent method for nonsmooth separable minimization
This is a talk given at ISMP, Jul 31 2006.
A Weighted Mirror Descent Algorithm for Nonsmooth Convex Optimization Problem
Large-scale nonsmooth convex optimization is a common problem across computational areas including machine learning and computer vision. Problems in these areas have special domain structures and characteristics, and treatment that exploits those structures can significantly reduce the computational burden. We present a weighted Mirror Descent method to so...
Multiple-Gradient Descent Algorithm for Multiobjective Optimization
The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multi-objective optimization by considering the concurrent minimization of n smooth criteria {J_i} (i = 1, ..., n). The novel algorithm is based on the following observation: consider a...
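A standard construction in multiple-gradient descent is to take the negative of the minimum-norm element of the convex hull of the criteria gradients, which (when nonzero) is a direction of simultaneous descent for all criteria. A hedged sketch for the two-objective case, where that minimum-norm point lies on a segment and has a closed form (function name and data are illustrative, not from the paper):

```python
import numpy as np

def common_descent_direction(g1, g2):
    # Minimum-norm point on the segment between g1 and g2:
    # minimize ||(1 - t) * g1 + t * g2||^2 over t in [0, 1].
    diff = g1 - g2
    denom = diff @ diff
    t = 0.0 if denom == 0.0 else float(np.clip((g1 @ diff) / denom, 0.0, 1.0))
    w = (1.0 - t) * g1 + t * g2   # min-norm convex combination
    return -w                     # zero exactly at Pareto-stationary points

g1 = np.array([1.0, 0.0])         # gradient of criterion J_1 (assumed)
g2 = np.array([0.0, 1.0])         # gradient of criterion J_2 (assumed)
d = common_descent_direction(g1, g2)
# Here d = [-0.5, -0.5], which has negative inner product with both gradients.
```

For n > 2 criteria the same minimum-norm problem is solved over the convex hull of all n gradients, which requires a small quadratic program rather than a closed form.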
Journal
Journal Title: SIAM Journal on Optimization
Year: 2023
ISSN: 1095-7189, 1052-6234
DOI: https://doi.org/10.1137/21m1468450